26 research outputs found

    Object-oriented programming with mixins in Ada

    Recently, I wrote a paper discussing the lack of 'true' object-oriented programming language features in Ada 83, why one might desire them in Ada, and how they might be added in Ada 9X. The approach I took in that paper was to build the new object-oriented features of Ada 9X as much as possible on the basic constructs and philosophy of Ada 83. The object-oriented features proposed for Ada 9X, while different in detail, are based on the same kind of approach. Further consideration of this approach led me to a long reflection on the nature of object-oriented programming and its application to Ada. The results of this reflection, presented in this paper, show how a fairly natural object-oriented style can indeed be developed even in Ada 83. The exercise of developing this style is useful for at least three reasons: (1) it provides a useful style for programming object-oriented applications in Ada 83 until new features become available with Ada 9X; (2) it demystifies many of the mechanisms that seem to be 'magic' in most object-oriented programming languages by making them explicit; and (3) it points out which areas of Ada 83 are, and are not, in need of change to make object-oriented programming more natural in Ada 9X. In the next four sections I address in turn the issues of object-oriented classes, mixins, self-reference, and supertyping. The presentation is through a sequence of examples. This results in some overlap with the earlier paper, but all the examples in the present paper are written entirely in Ada 83. I return to considerations for Ada 9X in the last section of the paper.
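The abstract's central idea, a mixin as a reusable slice of behavior that is combined with a host type rather than standing alone, can be illustrated outside Ada. This is a minimal, hypothetical Python sketch of the concept only; the paper itself builds the style from Ada 83 generics and packages, and none of the names below come from it.

```python
# Hypothetical illustration of the mixin idea: each mixin contributes one
# behavior, and a concrete class composes them.

class Persistent:
    """Mixin: adds a state-snapshot operation to any host class."""
    def save(self):
        return dict(self.__dict__)  # snapshot of the object's state

class Comparable:
    """Mixin: adds ordering based on a `key` attribute the host must provide."""
    def __lt__(self, other):
        return self.key < other.key

class Account(Persistent, Comparable):
    """Concrete class built by mixing the two behaviors into plain data."""
    def __init__(self, owner, balance):
        self.owner = owner
        self.balance = balance
        self.key = balance  # attribute the Comparable mixin relies on

a = Account("alice", 100)
b = Account("bob", 250)
print(a < b)              # ordering comes from the Comparable mixin
print(a.save()["owner"])  # persistence comes from the Persistent mixin
```

The point the sketch makes explicit, echoing reason (2) in the abstract, is that "mixin" is not magic: it is ordinary composition of partial behaviors with a host type's data.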

    Towards a general object-oriented software development methodology

    An object is an abstract software model of a problem domain entity. Objects are packages of both data and the operations on that data (Goldberg 83, Booch 83). The Ada (tm) package construct is representative of this general notion of an object. Object-oriented design is the technique of using objects as the basic unit of modularity in systems design. The Software Engineering Laboratory at the Goddard Space Flight Center is currently involved in a pilot program to develop a flight dynamics simulator in Ada (approximately 40,000 statements) using object-oriented methods. Several authors have applied object-oriented concepts to Ada (e.g., Booch 83, Cherry 85), but these methodologies were found to be limited. As a result, a more general approach was synthesized which allows a designer to apply powerful object-oriented principles to a wide range of applications and at all stages of design. An overview of this approach is provided, and how object-oriented design fits into the overall software life cycle is considered.

    A general model for attitude determination error analysis

    An overview is given of a comprehensive approach to filter and dynamics modeling for attitude determination error analysis. The models presented include both batch least-squares and sequential attitude estimation processes for both spin-stabilized and three-axis stabilized spacecraft. The discussion includes a brief description of a dynamics model of strapdown gyros, but it does not cover other sensor models. Model parameters can be chosen to be solve-for parameters, which are assumed to be estimated as part of the determination process, or consider parameters, which are assumed to have errors but not to be estimated. The only restriction on this choice is that the time evolution of the consider parameters must not depend on any of the solve-for parameters. The result of an error analysis is an indication of the contributions of the various error sources to the uncertainties in the determination of the spacecraft solve-for parameters. The model presented gives the uncertainty due to errors in the a priori estimates of the solve-for parameters, the uncertainty due to measurement noise, the uncertainty due to dynamic noise (also known as process noise), the uncertainty due to the consider parameters, and the overall uncertainty due to all these sources of error.
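The solve-for/consider split the abstract describes can be made concrete with a small batch least-squares example. This is a hypothetical numerical sketch of the standard consider-covariance technique, not the paper's actual model; all matrices below are invented. Solve-for parameters x are estimated from measurements z = Hx + Gc + v, while the consider parameters c contribute uncertainty but are never estimated.

```python
import numpy as np

# Invented sensitivities and covariances for illustration only.
rng = np.random.default_rng(0)
H = rng.standard_normal((6, 2))   # measurement sensitivity to solve-fors x
G = rng.standard_normal((6, 1))   # measurement sensitivity to consider params c
R = 0.01 * np.eye(6)              # measurement noise covariance
P0 = np.diag([1.0, 1.0])          # a priori covariance of the solve-fors
C = np.array([[0.04]])            # covariance of the consider parameters

# Weighted least-squares solution combining a priori info and measurements.
Ri = np.linalg.inv(R)
P = np.linalg.inv(np.linalg.inv(P0) + H.T @ Ri @ H)  # a priori + noise part
K = P @ H.T @ Ri                                      # estimator gain

# Sensitivity of the estimate to the unestimated consider parameters,
# and the extra uncertainty they contribute.
S = K @ G
P_consider = S @ C @ S.T

P_total = P + P_consider          # overall solve-for uncertainty
print(np.sqrt(np.diag(P)))        # uncertainty without consider errors
print(np.sqrt(np.diag(P_total)))  # overall uncertainty
```

The printed comparison shows exactly the kind of breakdown the abstract calls "the result of an error analysis": each term of P_total attributes uncertainty to one error source.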

    GOES dynamic propagation of attitude

    The spacecraft in the next series of Geostationary Operational Environmental Satellites (GOES-Next) are Earth pointing and have 5-year mission lifetimes. Because gyros can be depended on only for a few years of continuous use, they will be turned off during routine operations. This means attitude must, at times, be determined without the benefit of gyros and, often, using only Earth sensor data. To minimize the interruption caused by dumping angular momentum, these spacecraft have been designed to reduce the environmental torque acting on them and to incorporate an adjustable solar trim tab for fine adjustment. A new support requirement for GOES-Next is that of setting the solar trim tab; optimizing its setting requires an estimate of the unbalanced torque on the spacecraft. These two requirements, determining attitude without gyros and estimating the external torque, are addressed by replacing or supplementing the gyro propagation with a dynamic one, that is, one that integrates the rigid-body equations of motion. By processing quarter-orbit or longer batches, this approach takes advantage of roll-yaw coupling to observe attitude completely without Sun sensor data. Telemetered momentum wheel speeds are used as observations of the unbalanced external torques. GOES-Next provides a unique opportunity to study dynamic attitude propagation. The geosynchronous altitude and adjustable trim tab minimize the external torque and its uncertainty, making long-term dynamic propagation feasible. This paper presents the equations for dynamic propagation, an analysis of the environmental torques, and an estimate of the accuracies obtainable with the proposed method.
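The core of "dynamic propagation" is integrating the rigid-body (Euler) equations, I dω/dt = τ − ω × (Iω), instead of relying on gyro measurements. The sketch below is a hypothetical illustration with made-up inertias and rates, not GOES flight values or the paper's actual integrator; it propagates torque-free motion and checks that the angular momentum magnitude is conserved, a basic sanity test for any such propagator.

```python
import numpy as np

# Illustrative principal inertias and body rates (not GOES values).
I = np.diag([3000.0, 2500.0, 1800.0])   # kg * m^2
I_inv = np.linalg.inv(I)

def omega_dot(omega, torque):
    """Euler's rigid-body equations: I * domega/dt = torque - omega x (I omega)."""
    return I_inv @ (torque - np.cross(omega, I @ omega))

def rk4_step(omega, torque, dt):
    """One fixed-step Runge-Kutta 4 integration step for the body rates."""
    k1 = omega_dot(omega, torque)
    k2 = omega_dot(omega + 0.5 * dt * k1, torque)
    k3 = omega_dot(omega + 0.5 * dt * k2, torque)
    k4 = omega_dot(omega + dt * k3, torque)
    return omega + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

omega = np.array([0.001, -0.0005, 0.0002])  # rad/s
h0 = np.linalg.norm(I @ omega)              # initial momentum magnitude
for _ in range(1000):
    omega = rk4_step(omega, np.zeros(3), dt=1.0)  # torque-free propagation

# With zero external torque, |I omega| should be conserved to integrator accuracy.
print(abs(np.linalg.norm(I @ omega) - h0) / h0)
```

In the scheme the abstract describes, the zero torque above would be replaced by modeled environmental torques, with telemetered wheel speeds serving as observations of the unbalanced part.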

    UML Consistency Rules: A Case Study with Open-Source UML Models

    UML models are standard artifacts used by software engineers for designing software. As software is designed, different UML diagram types (e.g., class diagrams and sequence diagrams) are produced by software designers. Since the various UML diagram types describe different aspects of a software system, they are not independent but strongly depend on each other; hence they must be consistent. Inconsistencies cause faults in the final software systems. It is, therefore, paramount that they get detected, analyzed, and fixed. Consistency rules are a useful tool proposed in the literature to detect inconsistencies: they express constraints that, when violated, identify inconsistencies. This case study aims at collecting and analyzing UML models with OCL consistency rules proposed in the literature and at promoting the development of a reference benchmark that can be reused by the (FM-)research community. We collected 33 UML consistency rules and 206 different UML diagrams contained in 34 open-source UML models presented in the literature. We propose an FM-based encoding of the consistency rules in OCL. This encoding allows analyzing whether the consistency rules are satisfied or violated within the 34 UML models. To assess the proposed benchmark, we analyzed how the UML models, consistency rules, and diagram types contained in the benchmark help in assessing the consistency of UML models, and the consistency of diagrams across the different software development phases. Our results show that the considered UML models and consistency rules allowed us to identify 2731 inconsistencies and that those inconsistencies refer to different software development phases. We concluded that the considered UML models and consistency rules constitute an initial benchmark that can be further extended by the research community.
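A typical consistency rule of the kind the abstract describes relates two diagram types: for example, every message in a sequence diagram should name an operation declared by the receiving class in the class diagram. The sketch below is a hypothetical Python rendering of such a rule over invented model data; the study itself encodes its rules in OCL, and this rule and these names are illustrative assumptions, not taken from the benchmark.

```python
# Invented fragment of a class diagram: class name -> declared operations.
class_diagram = {
    "Order": {"addItem", "total"},
    "Payment": {"authorize"},
}

# Invented fragment of a sequence diagram: (receiver class, message name).
sequence_messages = [
    ("Order", "addItem"),
    ("Payment", "authorize"),
    ("Payment", "refund"),   # not declared anywhere -> inconsistency
]

def check_messages(classes, messages):
    """Return the messages that violate the rule: a message must name an
    operation declared by its receiving class."""
    return [(cls, op) for cls, op in messages
            if op not in classes.get(cls, set())]

violations = check_messages(class_diagram, sequence_messages)
print(violations)  # the one undeclared message is flagged
```

Running many such rules over many models is, in miniature, how the 2731 inconsistencies reported in the study arise: each rule violation in each diagram counts as one detected inconsistency.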

    Consistent histories of systems and measurements in spacetime

    Traditional interpretations of quantum theory in terms of wave function collapse are particularly unappealing when considering the universe as a whole, where there is no clean separation between classical observer and quantum system and where the description is inherently relativistic. As an alternative, the consistent histories approach provides an attractive "no collapse" interpretation of quantum physics. Consistent histories can also be linked to path-integral formulations that may be readily generalized to the relativistic case. A previous paper described how, in such a relativistic spacetime path formalism, the quantum history of the universe could be considered to be an eigenstate of the measurements made within it. However, two important topics were not addressed in detail there: a model of measurement processes in the context of quantum histories in spacetime, and a justification for why the probabilities for each possible cosmological eigenstate should follow Born's rule. The present paper addresses these topics by showing how Zurek's concepts of einselection and envariance can be applied in the context of relativistic spacetime and quantum histories. The result is a model of systems and subsystems within the universe and their interaction with each other and their environment. (RevTeX 4, 37 pages; accepted for publication in Foundations of Physics.)

    Foundations of a spacetime path formalism for relativistic quantum mechanics

    Quantum field theory is the traditional solution to the problems inherent in melding quantum mechanics with special relativity. However, it has also long been known that an alternative first-quantized formulation can be given for relativistic quantum mechanics, based on the parametrized paths of particles in spacetime. Because time is treated similarly to the three space coordinates, rather than as an evolution parameter, such a spacetime approach has proved particularly useful in the study of quantum gravity and cosmology. This paper shows how a spacetime path formalism can be considered to arise naturally from the fundamental principles of the Born probability rule, superposition, and Poincaré invariance. The resulting formalism can be seen as a foundation for a number of previous parametrized approaches in the literature, relating, in particular, "off-shell" theories to traditional on-shell quantum field theory. It reproduces the results of perturbative quantum field theory for free and interacting particles, but provides intriguing possibilities for a natural program for regularization and renormalization. Further, an important consequence of the formalism is that a clear probabilistic interpretation can be maintained throughout, with a natural reduction to non-relativistic quantum mechanics. (RevTeX 4, 42 pages; accepted for publication in the Journal of Mathematical Physics.)